Similar resources
Deep Illumination: Approximating Dynamic Global Illumination with Generative Adversarial Network
We present Deep Illumination, a novel machine learning technique for approximating global illumination (GI) in real-time applications using a Conditional Generative Adversarial Network. Our primary focus is on generating indirect illumination and soft shadows with offline rendering quality at interactive rates. Inspired by recent advances in image-to-image translation problems using deep g...
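As a point of reference for the conditional image-to-image setup this abstract describes, here is a minimal PyTorch sketch of a pix2pix-style conditional GAN: a generator maps conditioning buffers to an image, and a patch discriminator judges (condition, image) pairs. The channel counts, layer sizes, and the L1-plus-adversarial loss weighting are illustrative assumptions, not the Deep Illumination architecture itself.

# Minimal conditional GAN (pix2pix-style) sketch; sizes and loss weights are assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a conditioning image (e.g. stacked G-buffer channels) to an RGB output."""
    def __init__(self, in_ch=9, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, out_ch, 4, stride=2, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Patch discriminator conditioned on the input buffers and a candidate output."""
    def __init__(self, in_ch=9 + 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),  # per-patch real/fake logits
        )
    def forward(self, cond, img):
        return self.net(torch.cat([cond, img], dim=1))

def train_step(G, D, opt_g, opt_d, cond, target, l1_weight=100.0):
    bce = nn.BCEWithLogitsLoss()
    # Discriminator update: real pairs vs. generated pairs.
    fake = G(cond).detach()
    d_real, d_fake = D(cond, target), D(cond, fake)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator update: fool the discriminator while staying close to the reference render.
    fake = G(cond)
    d_fake = D(cond, fake)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * nn.functional.l1_loss(fake, target)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()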
Deep Generative Dual Memory Network for Continual Learning
Despite advances in deep learning, artificial neural networks do not learn the same way as humans do. Today, neural networks can learn multiple tasks when trained on them jointly, but cannot maintain performance on learnt tasks when tasks are presented one at a time; this phenomenon, called catastrophic forgetting, is a fundamental challenge to overcome before neural networks can learn continual...
Generative Deep Deconvolutional Learning
A generative model is developed for deep (multi-layered) convolutional dictionary learning. A novel probabilistic pooling operation is integrated into the deep model, yielding efficient bottom-up (pretraining) and top-down (refinement) probabilistic learning. After learning the deep convolutional dictionary, testing is implemented via deconvolutional inference. To speed up this inference, a new...
Learning Deep Generative Models
Building intelligent systems that are capable of extracting high-level representations from high-dimensional sensory data lies at the core of solving many artificial intelligence–related tasks, including object recognition, speech perception, and language understanding. Theoretical and biological arguments strongly suggest that building such systems requires models with deep architectures that ...
Auxiliary Deep Generative Models
Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables, which improve the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure...
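For context on how an auxiliary variable enters the variational bound, here is a minimal sketch following the general auxiliary-variable construction; the exact factorization used in the paper may differ. With generative model $p(z)\,p(x\mid z)$ augmented by $p(a\mid x,z)$ and inference network $q(a\mid x)\,q(z\mid a,x)$, the evidence lower bound becomes

\log p(x) \ \ge\ \mathbb{E}_{q(a,z\mid x)}\!\left[\log \frac{p(a\mid x,z)\,p(x\mid z)\,p(z)}{q(a\mid x)\,q(z\mid a,x)}\right].

Because $p(a\mid x,z)$ integrates to one, the marginal generative model over $(x,z)$ is unchanged, while the implied posterior approximation $q(z\mid x)=\int q(z\mid a,x)\,q(a\mid x)\,da$ becomes a richer mixture than a single factorized distribution.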
Journal
Journal title: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year: 2020
ISSN: 0162-8828, 2160-9292, 1939-3539
DOI: 10.1109/tpami.2020.3032286